

Search for: All records

Creators/Authors contains: "Hartzog, Woodrow"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not be available free of charge until the publisher's embargo period ends.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Lawmakers have started to regulate “dark patterns,” understood as design practices meant to influence technology users’ decisions through manipulative or deceptive means. Most agree that dark patterns are undesirable, but it remains unclear which design choices should be subject to scrutiny, much less how best to regulate them. In this Article, we propose adapting the concept of dark patterns to better fit legal frameworks. Critics allege that the legal conceptualizations of dark patterns are overbroad, impractical, and counterproductive. We argue that law and policy conceptualizations of dark patterns suffer from three deficiencies: First, dark patterns lack a clear value anchor for cases to build upon. Second, legal definitions of dark patterns focus too narrowly on individuals and atomistic choices, ignoring de minimis aggregate harms and the societal implications of manipulation at scale. Finally, the law has struggled to articulate workable legal thresholds for wrongful dark patterns. To better regulate the designs called dark patterns, lawmakers need a conceptual framing that bridges the gap between design theory and the law’s need for clarity, flexibility, and compatibility with existing frameworks. We argue that wrongful self-dealing is at the heart of what most consider to be “dark” about certain design patterns. Taking advantage of design affordances to the detriment of a vulnerable party is disloyal. To that end, we propose disloyal design as a regulatory framing for dark patterns. In drawing from established frameworks that prohibit wrongful self-dealing, we hope to provide more clarity and consistency for regulators, industry, and users. Disloyal design will fit better into legal frameworks and better rally public support for ensuring that the most popular tools in society are built to prioritize human values.
    Free, publicly-accessible full text available June 30, 2026
  2. Free, publicly-accessible full text available April 25, 2026
  3. In recent years, gig work platforms have gained popularity as a way for individuals to earn money; as of 2021, 16% of Americans have at some point earned money from such platforms. Despite their popularity, and despite these platforms' history of unfair data collection practices and worker-safety issues, little is known about the data gig platforms collect from workers (and users) or about the privacy dark patterns present in their apps. This paper presents an empirical measurement of 16 gig work platforms' data practices in the U.S. We analyze what data is collected by these platforms, and how it is shared and used. Finally, we consider how these practices constitute privacy dark patterns. To that end, we develop a novel combination of methods to address gig-worker-specific challenges in experimentation and data collection, enabling the largest in-depth study of such platforms to date. We find extensive data collection and sharing with 60 third parties—including sharing reversible hashes of worker Social Security Numbers (SSNs)—along with dark patterns that subject workers to greater privacy risk and opportunistically use collected data to nag workers in off-platform messages. We conclude this paper with proposed interdisciplinary mitigations for improving gig worker privacy protections. After we disclosed our SSN-related findings to the affected platforms, the platforms confirmed that the issue had been mitigated; this is consistent with our independent audit of those platforms. Analysis code and redacted datasets will be made available to those who wish to reproduce our findings.
    Free, publicly-accessible full text available January 1, 2026
  4. Privacy law is failing to protect individuals from being watched and exposed, despite stronger surveillance and data protection rules. The problem is that our rules look to social norms to set thresholds for privacy violations, but people can get used to being observed. In this Article, we argue that by ignoring de minimis privacy encroachments, the law is complicit in normalizing surveillance. Privacy law helps acclimate people to being watched by ignoring smaller, more frequent, and more mundane privacy diminutions. We call these reductions “privacy nicks,” like the proverbial “thousand cuts” that lead to death. Privacy nicks come from the proliferation of cameras and biometric sensors on doorbells, glasses, and watches, and the drift of surveillance and data analytics into new areas of our lives like travel, exercise, and social gatherings. Under our theory of privacy nicks as the Achilles heel of surveillance law, invasive practices become routine through repeated exposures that acclimate us to being vulnerable and watched in increasingly intimate ways. With acclimation comes resignation, and this shift in attitude biases how citizens and lawmakers view reasonable measures and fair tradeoffs. Because the law looks to norms and people’s expectations to set thresholds for what counts as a privacy violation, the normalization of these nicks results in a constant renegotiation of privacy standards to society’s disadvantage. When this happens, the legal and social threshold for rejecting invasive new practices keeps getting redrawn, excusing ever more aggressive intrusions. In effect, the test of what privacy law allows is whatever people will tolerate. There is no rule to stop us from tolerating everything. This Article provides a new theory and terminology to understand where privacy law falls short and suggests a way to escape the current surveillance spiral. 
  5. Internet-of-Things (IoT) devices are ubiquitous, but little attention has been paid to how they may incorporate dark patterns, despite the consumer-protection and privacy concerns arising from their unique access to intimate spaces and always-on capabilities. This paper conducts a systematic investigation of dark patterns in 57 popular, diverse smart home devices. We update manual interaction and annotation methods for the IoT context, then analyze dark pattern frequency across device types, manufacturers, and interaction modalities. We find that dark patterns are pervasive in IoT experiences but manifest in diverse ways across device traits. Speakers, doorbells, and camera devices contain the most dark patterns, and the manufacturers of such devices (Amazon and Google) exhibit more dark patterns than other vendors. We investigate how this distribution impacts the potential for consumer exposure to dark patterns, discuss broader implications for key stakeholders like designers and regulators, and identify opportunities for future dark patterns study.
  6. Dark patterns are user interface elements that can influence a person's behavior against their intentions or best interests. Prior work identified these patterns in websites and mobile apps, but little is known about how the design of platforms might impact dark pattern manifestations and related human vulnerabilities. In this paper, we conduct a comparative study of mobile application, mobile browser, and web browser versions of 105 popular services to investigate variations in dark patterns across modalities. We perform manual tests, identify dark patterns in each service, and examine how they persist or differ by modality. Our findings show that while services can employ some dark patterns equally across modalities, many dark patterns vary between platforms, and that these differences saddle people with inconsistent experiences of autonomy, privacy, and control. We conclude by discussing broader implications for policymakers and practitioners, and provide suggestions for furthering dark patterns research. 
  7.
    Automated systems like self-driving cars and “smart” thermostats are a challenge for fault-based legal regimes like negligence because they have the potential to behave in unpredictable ways. How can people who build and deploy complex automated systems be said to be at fault when they could not have reasonably anticipated the behavior (and thus risk) of their tools? 